
[AI Projects] Fix code-based evaluator catalog sample: add missing data_mapping#46824

Draft
slister1001 wants to merge 1 commit into Azure:main from slister1001:fix/code-evaluator-sample-data-mapping

Conversation

@slister1001
Member

Fix code-based evaluator catalog sample

The sample sample_eval_catalog_code_based_evaluators.py currently fails at client.evals.create(...) with HTTP 400:

UserError / EvalValidationFailed
Code: MissingRequiredDataMapping
Target: group.testing_criteria[0].data_mapping
Message: Data mapping for required field 'item' is missing for evaluator 'my_custom_evaluator_code'.

Root cause

The evaluator's data_schema declares "required": ["item"] — the grade(sample, item) function expects the whole item object — but the testing_criteria entry omitted data_mapping entirely. The server validates that every field in data_schema.required has a corresponding entry in the testing criterion's data_mapping.

For comparison, the working sample_eval_catalog_prompt_based_evaluators.py declares each field individually in data_schema and maps each one in data_mapping. The code-based sample needs to map the single required item field to the whole data-source item.
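The server-side rule described above can be sketched as follows. This is a hypothetical helper for illustration only (the real validation runs inside the Azure AI evaluation service); the schema shape is taken from the failing sample's required "item" field.

```python
def find_missing_mappings(data_schema, data_mapping):
    """Return required schema fields that have no entry in data_mapping.

    Mirrors the MissingRequiredDataMapping check: every field listed in
    data_schema["required"] must be mapped in the testing criterion.
    """
    required = data_schema.get("required", [])
    mapping = data_mapping or {}
    return [field for field in required if field not in mapping]


# The code-based evaluator's schema requires the whole 'item' object.
data_schema = {
    "type": "object",
    "properties": {"item": {"type": "object"}},
    "required": ["item"],
}

# Before the fix: the testing criterion had no data_mapping at all.
print(find_missing_mappings(data_schema, None))  # ['item'] -> HTTP 400

# After the fix: the required field is mapped to the data-source item.
print(find_missing_mappings(data_schema, {"item": "{{item}}"}))  # []
```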

Fixes

  1. Add data_mapping={"item": "{{item}}"} to the testing criterion — passes the entire data-source item to the evaluator's item parameter.
  2. Change pass_threshold type in init_parameters.properties from "string" to "number" — the runtime value 0.5 is a float, not a string. After the data-mapping fix is in place, this would be the next validation failure.
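The second fix follows from the JSON Schema "type" rule: a Python float such as 0.5 validates as "number", not "string". A minimal sketch of that rule (illustrative helper, not service code):

```python
def json_type_matches(value, schema_type):
    """Check a Python value against a JSON Schema primitive type name.

    Note: bool is excluded from "number" because bool subclasses int
    in Python, while JSON treats booleans as a distinct type.
    """
    checks = {
        "string": lambda v: isinstance(v, str),
        "number": lambda v: isinstance(v, (int, float)) and not isinstance(v, bool),
    }
    return checks[schema_type](value)


print(json_type_matches(0.5, "string"))  # False -> the next validation failure
print(json_type_matches(0.5, "number"))  # True  -> matches the runtime value
```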

Sample diff

-                    "properties": {"deployment_name": {"type": "string"}, "pass_threshold": {"type": "string"}},
+                    "properties": {"deployment_name": {"type": "string"}, "pass_threshold": {"type": "number"}},
...
         TestingCriterionAzureAIEvaluator(
             type="azure_ai_evaluator",
             name="my_custom_evaluator_code",
             evaluator_name="my_custom_evaluator_code",
+            data_mapping={"item": "{{item}}"},
             initialization_parameters={
                 "deployment_name": f"{model_deployment_name}",
                 "pass_threshold": 0.5,
             },
         )

Reproduction

Without this fix, running the sample with azure-ai-projects==2.1.0 against an Azure AI Foundry project produces the 400 error shown above. With this fix, evaluation creation succeeds.

Scope

Sample-only change. No SDK source code, generated code, public surface, or tests are affected.

The sample at sample_eval_catalog_code_based_evaluators.py failed with HTTP 400
'MissingRequiredDataMapping: Data mapping for required field item is missing'
because the testing criterion had no data_mapping for the evaluator's required
'item' field.

Also correct pass_threshold init_parameter schema from 'string' to 'number' to
match the float value (0.5) actually passed at runtime.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
